6 research outputs found

    A Silicon Model of Early Visual Processing

    Many of the most striking phenomena known from perceptual psychology are a direct result of the first levels of neural processing. In the visual systems of higher animals, the well-known center-surround response to local stimuli is responsible for some of the strongest visual illusions. For example, Mach bands, the Hermann-Hering grid illusion, and the Craik-O'Brien-Cornsweet illusion can all be traced to simple inhibitory interactions between elements of the retina (Ratliff 1965). The high degree to which a perceived image is independent of the absolute illumination level can be viewed as a property of the mechanism by which incident light is transduced into an electrical signal. We present a model of the first stages of retinal processing in which these phenomena are viewed as natural by-products of the mechanism by which the system adapts to a wide range of viewing conditions. Our retinal model is implemented as a single silicon chip, which contains integrated photoreceptors and processing elements; this chip generates, in real time, outputs that correspond directly to signals observed in the corresponding levels of biological retinas.
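
    As an aside on the center-surround interactions mentioned above, the sketch below is a minimal 1-D difference-of-Gaussians filter, an illustrative stand-in rather than the chip's actual resistive-network circuit; the sigma values and kernel radius are assumptions. It shows how an excitatory center minus a broader inhibitory surround produces the overshoot and undershoot at a luminance edge that underlie Mach bands.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    # Normalized 1-D Gaussian kernel of the given half-width.
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    return k / k.sum()

def center_surround(signal, sigma_center=1.0, sigma_surround=4.0, radius=15):
    # Excitatory center minus broader inhibitory surround (difference of Gaussians).
    center = np.convolve(signal, gaussian_kernel(sigma_center, radius), mode="same")
    surround = np.convolve(signal, gaussian_kernel(sigma_surround, radius), mode="same")
    return center - surround

# Luminance step edge: a dark field next to a bright field.
edge = np.concatenate([np.full(50, 0.2), np.full(50, 0.8)])
response = center_surround(edge)

# The response overshoots just on the bright side of the edge and undershoots
# on the dark side -- the classic Mach-band profile.
print(int(response.argmax()), int(response.argmin()))
```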

    Implementing neural architectures using analog VLSI circuits

    Analog very large-scale integrated (VLSI) technology can be used not only to study and simulate biological systems, but also to emulate them in designing artificial sensory systems. A methodology for building these systems in CMOS VLSI technology has been developed using analog micropower circuit elements that can be hierarchically combined. Using this methodology, experimental VLSI chips of visual and motor subsystems have been designed and fabricated. These chips exhibit behavior similar to that of biological systems, and perform computations useful for artificial sensory systems.
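
    For a sense of the micropower building blocks such a methodology combines, the sketch below evaluates the idealized tanh transfer characteristic commonly used to describe a subthreshold CMOS transconductance amplifier. The bias current, kappa, and thermal-voltage values are assumptions chosen for illustration, not taken from the paper.

```python
import math

def ota_output_current(delta_v, i_bias=1e-9, kappa=0.7, u_t=0.025):
    # Idealized subthreshold transconductance-amplifier characteristic:
    # roughly linear for small differential inputs, saturating at the
    # nanoamp-scale bias current for large inputs.
    return i_bias * math.tanh(kappa * delta_v / (2.0 * u_t))

for dv in (-0.2, -0.05, 0.0, 0.05, 0.2):
    print(f"{dv:+.2f} V -> {ota_output_current(dv):+.3e} A")
```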

    The Silicon Retina

    A chip based on the neural architecture of the eye proves a new, more powerful way of doing computations

    Hybrid analog-digital architectures for neuromorphic systems

    Signal restoration is necessary to perform computations of significant complexity. In digital computers each state variable is restored to a binary value, but this strategy is incompatible with analog computation. Nevertheless, cortical neurons, whose major mode of operation is analog, are able to perform prodigious feats of computation. Our research on visual cortex suggests that cortical neurons are able to compute reliably because they are organized into populations in which the signal at each neuron is restored to an appropriate analog value by a collective strategy. The strategy depends on feedback amplification that restores an input signal towards a stored analog memory. This principle is similar to recall by autoassociative neural networks. Networks of cortical amplifiers can solve simple visual processing tasks. They are well-suited to sensory processing because the same principle that restores their analog signals can also extract meaningful features from ambiguous sensory input. We describe a hybrid analog-digital CMOS architecture for constructing networks of cortical amplifiers. This neuromorphic architecture is a step towards exploring analog computers whose distributed signal restoration permits them to perform reliably sequential computations of great depth.
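
    The restoration-toward-a-stored-analog-memory idea can be caricatured with a toy linear network: feedback through a single stored pattern amplifies the input component along that pattern relative to orthogonal noise, much like autoassociative recall. The sketch below is only an illustrative analogue under those assumptions, not the paper's CMOS cortical-amplifier architecture, and all parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.standard_normal(16)
p /= np.linalg.norm(p)            # stored analog "memory" direction
W = 0.9 * np.outer(p, p)          # feedback gain of 0.9 along p (kept < 1 for stability)

signal = 1.0 * p                                      # clean analog signal
noisy_input = signal + 0.5 * rng.standard_normal(16)  # corrupted input

# Relax a leaky linear network with feedback to steady state: x = noisy_input + W x.
x = np.zeros(16)
dt = 0.1
for _ in range(300):
    x += dt * (-x + W @ x + noisy_input)

def alignment(v):
    # Fraction of the vector's length that lies along the stored pattern.
    return abs(v @ p) / np.linalg.norm(v)

# Feedback boosts the component along p by 1 / (1 - 0.9) = 10x relative to the
# orthogonal noise, so the settled state points much closer to the stored memory.
print(round(alignment(noisy_input), 3), round(alignment(x), 3))
```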

    Recurrent excitation in neocortical cells

    The majority of synapses in the mammalian cortex originate from cortical neurons. Indeed, the largest input to cortical cells comes from neighboring excitatory cells. However, most models of cortical development and processing do not reflect the anatomy and physiology of feedback excitation and are restricted to serial feedforward excitation. This report describes how populations of neurons in cat visual cortex can use excitatory feedback, characterized as an effective "network conductance", to amplify their feedforward input signals and demonstrates how neuronal discharge can be kept proportional to stimulus strength despite strong, recurrent connections that threaten to cause runaway excitation. These principles are incorporated into models of cortical direction and orientation selectivity that emphasize the basic design principles of cortical architectures.
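
    The amplification-with-proportionality argument can be made concrete with a minimal linear rate model: with feedback gain w < 1 the steady-state rate is r = input / (1 - w), so recurrence multiplies the feedforward drive by a fixed gain while the response stays proportional to stimulus strength, whereas w >= 1 would give runaway excitation. The sketch below uses assumed parameter values and is not the paper's biophysical model.

```python
def steady_state_rate(feedforward_input, w, steps=1000, dt=0.05):
    # Leaky rate unit with recurrent excitation: dr/dt = -r + w*r + input.
    r = 0.0
    for _ in range(steps):
        r += dt * (-r + w * r + feedforward_input)
    return r

# With w = 0.8 the effective gain is 1 / (1 - 0.8) = 5, so responses stay
# proportional to stimulus strength despite the strong recurrent feedback.
for stimulus in (1.0, 2.0, 4.0):
    print(stimulus, round(steady_state_rate(stimulus, w=0.8), 2))
```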

    Spiking Neural Computing in Memristive Neuromorphic Platforms

    Neuromorphic computation using Spiking Neural Networks (SNNs) is proposed as an alternative solution for the future of computation, addressing the memory-bottleneck issue in recent computer architectures. Different spike codings have been discussed to improve data transfer and data processing in neuro-inspired computation paradigms. Choosing an appropriate neural network topology can yield better performance in computation, recognition, and classification. The neuron model is another important factor in designing and implementing SNN systems; the speed of simulation and implementation, ease of integration with the other elements of the network, and suitability for scalable networks are the criteria for selecting one. Learning algorithms are a significant consideration for training the network through weight modification. Learning in neuromorphic architectures can be improved both by improving the quality of the artificial synapse and by improving the learning algorithm, such as STDP. In this chapter we propose a new synapse box that can remember and forget. Furthermore, as STDP is the most frequently used unsupervised method for network training in SNNs, we analyze and review its variants: the sequential order of pre- or postsynaptic spikes occurring across a synapse in an interval of time leads to different STDP methods, and the method used for weight modification is chosen based on the importance of stability as well as Hebbian or anti-Hebbian competition. We also survey the most significant projects that have produced neuromorphic platforms. The advantages and disadvantages of each neuromorphic platform are introduced in this chapter.
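
    Since the chapter reviews STDP variants, the sketch below shows one common pair-based formulation: the weight change depends on the sign and size of the interval between pre- and postsynaptic spike times. The time constants and amplitudes are illustrative assumptions, and this is only one of the several STDP rules such surveys cover.

```python
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:       # pre fires before post: potentiation (Hebbian)
        return a_plus * math.exp(-dt / tau_plus)
    if dt < 0:       # post fires before pre: depression
        return -a_minus * math.exp(dt / tau_minus)
    return 0.0

# Causal pairing strengthens the synapse, anti-causal pairing weakens it.
print(stdp_dw(t_pre=10.0, t_post=15.0))   # > 0
print(stdp_dw(t_pre=15.0, t_post=10.0))   # < 0
```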